Information Distances versus Entropy Metric
Authors
Abstract
Information distance has become an important tool in a wide variety of applications. Various information distance measures have been introduced over the years. These measures differ from entropy metrics: the former are based on Kolmogorov complexity, the latter on Shannon entropy. However, for any computable probability distribution, the expected value of Kolmogorov complexity equals the Shannon entropy up to an additive constant. We study the analogous relationship between entropy and information distance. We also study the relationship between entropy and the normalized versions of information distances.
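Since Kolmogorov complexity is uncomputable, information distance is approximated in practice by real compressors. The sketch below, a common illustration not taken from this paper, computes the empirical Shannon entropy of a string and the normalized compression distance (NCD), with `zlib` standing in for the ideal compressor.

```python
import zlib
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Shannon entropy, in bits per symbol, of the empirical distribution of s."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a computable proxy for the
    normalized information distance, using zlib compressed lengths
    in place of Kolmogorov complexities."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)
```

Identical inputs yield an NCD close to zero, while unrelated inputs push it toward one; the residual gap above zero reflects the overhead of a real compressor relative to the ideal one.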
Similar resources
Cross Entropy Information Metric for Quantification and Cluster Analysis of Accents
This paper proposes a method for measuring and quantifying the impact of accents on speech models. An accent metric is introduced based on the cross entropy (CE) of the probability models of speech from different accents. The CE metric has potential for use in the analysis, identification, quantification, and ranking of the salient features of accents. The accent metric is used for phon...
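The quantity underlying such a metric is the discrete cross entropy between two probability models. The following is a minimal sketch of that quantity only; the paper's actual accent models are full speech models, which are not reproduced here.

```python
from math import log2

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_x p(x) log2 q(x), in bits.
    It equals H(p) when q == p and grows as q diverges from p,
    which is what lets it rank model mismatch between accents."""
    return -sum(pi * log2(qi) for pi, qi in zip(p, q) if pi > 0)
```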
Several remarks on the metric space of genetic codes
A genetic code, the mapping from trinucleotide codons to amino acids, can be viewed as a partition on the set of 64 codons. A small set of non-standard genetic codes is known, and these codes can be mathematically compared by their partitions of the codon set. To measure distances between set partitions, this study defines a parameterised family of metric functions that includes Shannon entropy...
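A standard Shannon-entropy-based metric on set partitions, which the family described above plausibly relates to, is the variation of information VI(A, B) = H(A|B) + H(B|A). This sketch is an illustration of that general metric, not of the paper's specific parameterised family.

```python
from math import log2

def variation_of_information(part_a, part_b):
    """Variation of information between two partitions of the same
    finite set, each given as a list of disjoint Python sets.
    VI(A, B) = H(A|B) + H(B|A); it is zero iff the partitions agree."""
    n = sum(len(block) for block in part_a)
    vi = 0.0
    for a in part_a:
        for b in part_b:
            inter = len(a & b)
            if inter:
                p_ab = inter / n  # joint probability of landing in both blocks
                vi -= p_ab * (log2(inter / len(a)) + log2(inter / len(b)))
    return vi
```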
Entropy of a semigroup of maps from a set-valued view
In this paper, we introduce a new entropy-like invariant, named Hausdorff metric entropy, for finitely generated semigroups acting on compact metric spaces from a set-valued view and study its properties. We establish the relation between Hausdorff metric entropy and topological entropy of a semigroup defined by Bis. Some examples with positive or zero Hausdorff metric entropy are given. Moreov...
Geometric Entropy of Geodesic Currents on Free Groups
A geodesic current on a free group F is an F-invariant measure on the set ∂²F of pairs of distinct points of ∂F. The space of geodesic currents on F is a natural companion of Culler-Vogtmann's Outer space cv(F), and studying them together yields new information about both spaces as well as about the group Out(F). The main aim of this paper is to introduce and study the notion of geometric ent...
Information Geometry of Non-Equilibrium Processes in a Bistable System with a Cubic Damping
A probabilistic description is essential for understanding the dynamics of stochastic systems far from equilibrium, given uncertainty inherent in the systems. To compare different Probability Density Functions (PDFs), it is extremely useful to quantify the difference among different PDFs by assigning an appropriate metric to probability such that the distance increases with the difference betwe...
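One elementary way to quantify the difference between two discrete PDFs is relative entropy. This sketch shows only that baseline quantity; it is not the information-geometric metric the paper constructs, which is a genuine distance along a path of PDFs.

```python
from math import log2

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits between two discrete PDFs.
    Non-negative, and zero iff p == q; note it is asymmetric, so it
    is a divergence rather than a true metric."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```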
Journal: Entropy
Volume: 19
Pages: -
Publication date: 2017